StackSeq2Seq: Dual Encoder Seq2Seq Recurrent Networks
Authors
Abstract
A widely studied non-deterministic polynomial-time (NP) hard problem is finding a route between two nodes of a graph. On graphs with a large number of nodes, heuristic search algorithms such as A∗ are often employed. Here, we propose a deep recurrent neural network architecture based on the Sequence-to-Sequence (Seq2Seq) model, widely used, for instance, in text translation. In particular, we show that using a context vector learned from two different recurrent networks yields higher accuracy when learning the shortest route through a graph. Additionally, we show that the performance of the Seq2Seq network can be boosted by smoothing the loss function via a homotopy continuation of the decoder's loss function.
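The two ingredients named in the abstract can be sketched in a few lines. The following is an illustrative toy sketch, not the paper's actual implementation: `rnn_encode`, `dual_context`, and `homotopy_loss` are hypothetical names, the encoders are minimal vanilla RNNs rather than the architecture the paper uses, and the homotopy schedule is assumed to blend a smoothed surrogate loss into the target decoder loss as a weight `lam` moves from 0 to 1 over training.

```python
import numpy as np

def rnn_encode(x_seq, W_xh, W_hh):
    """Run a minimal vanilla RNN over a sequence; return the final hidden state."""
    h = np.zeros(W_hh.shape[0])
    for x in x_seq:
        h = np.tanh(W_xh @ x + W_hh @ h)
    return h

def dual_context(x_seq, enc_a, enc_b):
    """Concatenate the final states of two independently parameterised encoders
    into a single context vector for the decoder (the 'dual encoder' idea)."""
    h_a = rnn_encode(x_seq, *enc_a)
    h_b = rnn_encode(x_seq, *enc_b)
    return np.concatenate([h_a, h_b])

def homotopy_loss(loss_smooth, loss_target, lam):
    """Homotopy continuation of the loss: start from a smoothed surrogate
    (lam = 0) and continuously deform it into the target loss (lam = 1)."""
    return (1.0 - lam) * loss_smooth + lam * loss_target

# Toy usage: encode one random input sequence with two encoders.
rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 5
x_seq = rng.standard_normal((T, d_in))
enc_a = (rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)))
enc_b = (rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)))
context = dual_context(x_seq, enc_a, enc_b)
print(context.shape)  # context vector is twice the hidden size: (16,)
```

In this reading, the decoder consumes the concatenated context (here of size 2·d_h), and training anneals `lam` from 0 to 1 so early optimisation happens on the smoother loss surface.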
Similar resources
Sequence stacking using dual encoder Seq2Seq recurrent networks
A widely studied non-deterministic polynomial-time (NP) hard problem lies in finding a route between two nodes of a graph. On graphs with a large number of nodes, heuristic search algorithms such as A∗ are often employed. Here, we propose a deep recurrent neural network architecture based on the Sequence-to-Sequence model, widely used, for instance, in text translation. In particular, we illustrate that utilising a...
Recurrent Neural Network-Based Semantic Variational Autoencoder for Sequence-to-Sequence Learning
Sequence-to-sequence (Seq2seq) models have played an important role in the recent success of various natural language processing methods, such as machine translation, text summarization, and speech recognition. However, current Seq2seq models have trouble preserving global latent information from a long sequence of words. Variational autoencoder (VAE) alleviates this problem by learning a continuo...
DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks
In this work, we present a compact, modular framework for constructing new recurrent neural architectures. Our basic module is a new generic unit, the Transition Based Recurrent Unit (TBRU). In addition to hidden layer activations, TBRUs have discrete state dynamics that allow network connections to be built dynamically as a function of intermediate activations. By connecting multiple TBRUs, we...
Enterprise to Computer: Star Trek chatbot
Human interactions and human-computer interactions are strongly influenced by style as well as content. Adding a persona to a chatbot makes it more human-like and contributes to a better and more engaging user experience. In this work, we propose a design for a chatbot that captures the style of Star Trek by incorporating references from the show along with peculiar tones of the fictional chara...
Table-to-text Generation by Structure-aware Seq2seq Learning
Table-to-text generation aims to generate a description for a factual table which can be viewed as a set of field-value records. To encode both the content and the structure of a table, we propose a novel structure-aware seq2seq architecture which consists of field-gating encoder and description generator with dual attention. In the encoding phase, we update the cell memory of the LSTM unit by ...
Publication date: 2017